feat: add first-class GLM (Zhipu AI) provider support#2

Open
garyblankenship wants to merge 2 commits into voocel:main from garyblankenship:feat/glm-provider

Conversation

@garyblankenship

Summary

  • Add native GLM provider support using litellm's registered glm provider, which correctly handles Zhipu's /api/paas/v4/chat/completions endpoint path (the OpenAI provider incorrectly appends /v1/)
  • Register "zhipu" as a known provider type (auto-resolves to "glm" protocol)
  • Add ZHIPU_API_KEY / ZHIPU_BASE_URL environment variable fallback

Configuration

{
  "provider": "zhipu",
  "model": "glm-5.1",
  "providers": {
    "zhipu": {
      "base_url": "https://api.z.ai/api/coding/paas/v4",
      "models": ["glm-5.1", "glm-5", "glm-4.7", "glm-4.5-flash"],
      "small_model": "glm-4.5-flash"
    }
  }
}

API key via environment: export ZHIPU_API_KEY=your-key

Depends on #1

Test plan

  • Configure zhipu provider with ZHIPU_API_KEY env var and verify chat completions work with glm-5.1
  • Verify requests hit api.z.ai (not OpenAI) by checking the error message on an invalid key
  • Verify existing providers (openai, anthropic, gemini, openrouter) are unaffected
  • go build ./... and go test ./... pass

Gary Blankenship added 2 commits April 6, 2026 17:31
ProviderCredentials discarded the entire provider config (including
base_url) when api_key was empty, falling back to EnvCredentials which
returned no base URL for custom providers. Requests then hit the
default OpenAI endpoint instead of the configured one.

Also fix ensureProviderSetup to try env var fallback before erroring
when a config file exists but api_key is empty.
Wire up the native litellm GLM provider instead of routing through
the OpenAI provider, which incorrectly appends /v1/ to base URLs.

- Add "zhipu" to KnownProviderTypes (maps to "glm" protocol)
- Add ZHIPU_API_KEY / ZHIPU_BASE_URL environment variable support
- Add "glm" case in provider factory using litellm's registered
  GLM provider with correct endpoint path handling

Configuration example:
  "zhipu": {
    "base_url": "https://api.z.ai/api/coding/paas/v4",
    "models": ["glm-5.1", "glm-5", "glm-4.7"]
  }
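The endpoint-path problem this commit works around can be illustrated like this; the URL-joining functions are a simplification of what the two client paths do, not the library's real code:

```go
package main

import "fmt"

// An OpenAI-style client appends "/v1/chat/completions" to the base URL,
// which produces a broken path for Zhipu's /api/paas/v4 bases; a GLM-aware
// client appends only "/chat/completions".
func openAIStyleEndpoint(base string) string { return base + "/v1/chat/completions" }
func glmStyleEndpoint(base string) string    { return base + "/chat/completions" }

func main() {
	base := "https://api.z.ai/api/coding/paas/v4"
	fmt.Println(openAIStyleEndpoint(base)) // wrong: spurious /v1/ segment
	fmt.Println(glmStyleEndpoint(base))    // correct Zhipu endpoint
}
```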
Copilot AI review requested due to automatic review settings April 6, 2026 21:33

Copilot AI left a comment


Pull request overview

Adds first-class support for Zhipu AI / GLM by routing a new "glm" provider type through LiteLLM, exposing it via a "zhipu" provider key, and improving env-var credential fallback behavior.

Changes:

  • Add "glm" provider type handling via a LiteLLM-backed adapter in model creation.
  • Register "zhipu" as a known provider that resolves to "glm" protocol and add ZHIPU_API_KEY / ZHIPU_BASE_URL env var support.
  • Adjust bootstrap credential checks to allow env-var fallback even when a config file exists but has an empty api_key.

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 2 comments.

  • internal/provider/provider.go: Adds GLM model creation path using LiteLLM provider client.
  • internal/config/settings.go: Adds "zhipu" -> "glm" mapping and Zhipu env var credential support; preserves config base URL when key comes from env.
  • internal/bootstrap/input.go: Allows env fallback for API key even when settings files exist but credentials aren't configured in-file.


@@ -18,6 +18,7 @@ var KnownProviderTypes = map[string]string{
"openai": "openai",
"openrouter": "openrouter",
"gemini": "gemini",

Copilot AI Apr 6, 2026


provider.go now supports the provider type "glm", but KnownProviderTypes doesn't include a "glm" entry. If a user sets settings.provider to "glm" (mirroring how other providers work), ProviderType() will currently default to "openai" and route through the OpenAI client instead of the GLM/LiteLLM path. Consider adding "glm": "glm" here (and aligning env var lookup accordingly) so "glm" works as a first-class provider key, not only as an inferred type for "zhipu".

Suggested change
- "gemini": "gemini",
+ "gemini": "gemini",
+ "glm": "glm",
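The gap Copilot describes can be reproduced in isolation; the map contents come from the diff above, but the function name and its default-to-openai behavior are assumptions about the surrounding code:

```go
package main

import "fmt"

// knownProviderTypes as in the diff: "zhipu" resolves to "glm",
// but there is no entry for "glm" itself.
var knownProviderTypes = map[string]string{
	"openai":     "openai",
	"openrouter": "openrouter",
	"gemini":     "gemini",
	"zhipu":      "glm",
}

// providerType defaults unknown names to "openai", which is why a bare
// "glm" provider key would route through the OpenAI client without the
// suggested "glm": "glm" entry.
func providerType(name string) string {
	if t, ok := knownProviderTypes[name]; ok {
		return t
	}
	return "openai"
}

func main() {
	fmt.Println(providerType("zhipu")) // "glm"
	fmt.Println(providerType("glm"))   // "openai": the misrouting Copilot flags
}
```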

"openai": {"OPENAI_API_KEY", "OPENAI_BASE_URL"},
"openrouter": {"OPENROUTER_API_KEY", "OPENROUTER_BASE_URL"},
"gemini": {"GEMINI_API_KEY", "GEMINI_BASE_URL"},
"zhipu": {"ZHIPU_API_KEY", "ZHIPU_BASE_URL"},

Copilot AI Apr 6, 2026


EnvCredentials() only checks providerEnvVars, but providerEnvVars has a "zhipu" entry and no corresponding "glm" entry. This means settings.provider="glm" (or any flow that queries env creds for "glm") will never pick up ZHIPU_API_KEY / ZHIPU_BASE_URL. Consider adding a "glm" mapping (likely to the same env vars) or otherwise ensuring env fallback works for the provider type name as well.

Suggested change
- "zhipu": {"ZHIPU_API_KEY", "ZHIPU_BASE_URL"},
+ "zhipu": {"ZHIPU_API_KEY", "ZHIPU_BASE_URL"},
+ "glm": {"ZHIPU_API_KEY", "ZHIPU_BASE_URL"},

